
    PARLOMA – A Novel Human-Robot Interaction System for Deaf-blind Remote Communication

    Deaf-blindness forces people to live in isolation. To date, no technological solution enables two or more Deaf-blind persons to communicate remotely with one another in tactile Sign Language (t-SL). When resorting to t-SL, Deaf-blind persons can communicate only with people physically present in the same place, because each must explore the other's hands to exchange messages. We present a preliminary version of PARLOMA, a novel system that enables remote communication between Deaf-blind persons. It is composed of a low-cost depth sensor as the only input device, paired with a robotic hand as the output device. Any user can perform handshapes in front of the depth sensor; the system recognizes a set of handshapes, transmits them over the web, and reproduces them with an anthropomorphic robotic hand. PARLOMA can thus work as a "telephone" for Deaf-blind people and promises to dramatically improve their quality of life. PARLOMA has been designed in close collaboration with the main Italian Deaf-blind associations, so as to include end users in the design phase.
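The recognize-transmit-reproduce pipeline described in the abstract can be sketched minimally as follows. All names here (classify_handshape, the JSON message format, the actuator call) are hypothetical stand-ins for illustration, not PARLOMA's actual modules:

```python
import json
import socket

def classify_handshape(depth_frame):
    """Map a depth frame to a discrete handshape label (stub classifier)."""
    return "letter_A"  # placeholder result for illustration

def send_handshape(label, host="127.0.0.1", port=9000):
    """Sender side: serialize the recognized handshape and ship it over TCP."""
    msg = json.dumps({"handshape": label}).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(msg)

def receive_and_actuate(port=9000):
    """Receiver side: decode the label and drive the robotic hand."""
    with socket.create_server(("127.0.0.1", port)) as server:
        conn, _ = server.accept()
        with conn:
            data = conn.recv(1024)
    label = json.loads(data)["handshape"]
    # robotic_hand.set_pose(label)  # hypothetical actuator call
    return label
```

Only discrete handshape labels cross the network in this sketch, which keeps bandwidth and latency low compared to streaming raw depth frames.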

    Vision-Based Pose Estimation for Robot-Mediated Hand Telerehabilitation

    Vision-based Pose Estimation (VPE) is a non-invasive way to achieve smooth, natural interaction between a human user and a robotic system without complex calibration procedures. Moreover, VPE interfaces are gaining momentum because they are highly intuitive and can be used by untrained personnel (e.g., a generic caregiver), even in delicate tasks such as rehabilitation exercises. In this paper, we present a novel master–slave setup for hand telerehabilitation with an intuitive, simple interface for remote control of a wearable hand exoskeleton named HX. While rehabilitative exercises are performed, the master unit estimates the 3D positions of a human operator's hand joints in real time using only an RGB-D camera and remotely commands the slave exoskeleton. Within the slave unit, the exoskeleton replicates the hand movements while an external grip sensor records interaction forces, which are fed back to the operator-therapist, allowing direct real-time assessment of the rehabilitative task. Experimental data collected with an operator and six volunteers show the feasibility of the proposed system and its performance. The results demonstrate that, with our system, the operator was able to directly control the volunteers' hand movements.
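One cycle of the master-slave loop described above can be sketched as follows. Every name here (estimate_joint_positions, Exoskeleton, GripSensor) is a hypothetical stand-in, not the actual HX API:

```python
def estimate_joint_positions(rgbd_frame):
    """Master unit: vision-based 3D hand-joint estimation (stub)."""
    return [0.0] * 15  # e.g., one flexion angle per tracked joint

class Exoskeleton:
    """Slave unit: replicates the commanded joint positions."""
    def __init__(self):
        self.joints = [0.0] * 15

    def move_to(self, joints):
        self.joints = list(joints)

class GripSensor:
    """External sensor recording interaction forces on the slave side."""
    def read_force(self):
        return 1.5  # Newtons, placeholder reading

def telerehab_step(rgbd_frame, exo, sensor):
    """One control cycle: camera pose -> exoskeleton motion -> force feedback."""
    joints = estimate_joint_positions(rgbd_frame)  # master unit
    exo.move_to(joints)                            # slave replicates movement
    return sensor.read_force()                     # fed back to the therapist
```

Running this step in a loop closes the feedback path: the therapist's hand drives the patient's exoskeleton, and the measured grip force returns to the therapist each cycle.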

    Real-Time Single Camera Hand Gesture Recognition System for Remote Deaf-Blind Communication

    This paper presents a fast approach to marker-less full-DOF hand tracking that leverages only depth information from a single depth camera. Such a system is useful in many applications, ranging from tele-presence to remote control of robotic actuators or interaction with 3D virtual environments. We applied the proposed technology to enable remote transmission of signs from Tactile Sign Languages (i.e., Sign Languages with tactile feedback), allowing non-invasive remote communication not only among deaf-blind users, but also with deaf, blind, and hearing users proficient in Sign Language. We show that our approach paves the way to fluid and natural remote communication for deaf-blind people, until now impossible. This system is a first prototype for the PARLOMA project, which aims at designing a remote communication system for deaf-blind people.
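A full-DOF hand pose is compact enough to stream per frame. As a minimal sketch, assuming a hypothetical 26-DOF hand model (one angle per degree of freedom, a common choice for full-hand kinematics), the tracker output could be packed into a fixed-size binary payload for low-latency transmission:

```python
import struct

NUM_DOF = 26  # assumed full-hand model; not taken from the paper

def pack_pose(angles):
    """Serialize a list of joint angles (radians) into a byte payload."""
    assert len(angles) == NUM_DOF
    return struct.pack(f"<{NUM_DOF}f", *angles)  # little-endian float32

def unpack_pose(payload):
    """Recover the joint angles on the receiving (actuator) side."""
    return list(struct.unpack(f"<{NUM_DOF}f", payload))
```

At 32 bits per angle this is 104 bytes per frame, so even a 30 Hz tracker produces only a few kilobytes per second, well within reach of an ordinary web connection.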